Duality-Induced Regularizer for Tensor Factorization Based Knowledge Graph Completion Supplementary Material
Theorem 1. Suppose that X̂

In DB models, the commonly used p is either 1 or 2. When p = 2, DURA takes the form of Equation (8) in the main text. If p = 1, we cannot expand the squared score function of the associated DB models as in Equation (4). Therefore, we choose p = 2.

Table 2: Hyperparameters found by grid search.

Suppose that k is the number of triplets known to be true in the knowledge graph and n is the embedding dimension of entities. That is to say, the computational complexity of weighted DURA is the same as that of the weighted squared Frobenius norm regularizer.
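To make the complexity claim concrete, the following is a minimal NumPy sketch comparing a DURA-style penalty against the squared Frobenius norm penalty for a batch of k triplets. The CP-style embeddings and the exact penalty forms here are illustrative assumptions, not the paper's implementation; both penalties touch each of the k·n embedding entries a constant number of times, so both cost O(kn) per batch.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 1000, 64  # k triplets known to be true, embedding dimension n

# Hypothetical CP-style embeddings for a batch of k triplets (h, r, t):
# relations act as diagonal matrices, i.e., elementwise products.
h = rng.normal(size=(k, n))  # head entity embeddings
r = rng.normal(size=(k, n))  # relation embeddings
t = rng.normal(size=(k, n))  # tail entity embeddings

def dura_penalty(h, r, t):
    """DURA-style terms: ||h ∘ r||² + ||t||², summed over the batch -- O(kn)."""
    return np.sum((h * r) ** 2) + np.sum(t ** 2)

def frobenius_penalty(h, r, t):
    """Squared Frobenius norm on the same embeddings -- also O(kn)."""
    return np.sum(h ** 2) + np.sum(r ** 2) + np.sum(t ** 2)

print(dura_penalty(h, r, t), frobenius_penalty(h, r, t))
```

Both functions reduce to elementwise products and sums over k·n entries, which is the sense in which weighted DURA matches the weighted squared Frobenius norm regularizer in computational cost.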
Review for NeurIPS paper: Duality-Induced Regularizer for Tensor Factorization Based Knowledge Graph Completion
Additional Feedback: - Line 70: DB methods are based on the Minkowski distance; however, in this paper the duality is established only for the case of the Frobenius norm, i.e., the Minkowski distance with p = 2. It would be nice if the authors provided a deeper explanation of the role of the parameter p in DB methods. What is the optimal value of p in state-of-the-art methods? The sentence "the regularizer 5 and 6" should be changed to "the regularizer 4 and 5" (check equation numbering). - Line 180: Since the regularizer has several terms, it would be convenient to consider different regularization coefficients as hyperparameters. In fact, in the supplementary material (lines 36-37) the cost function has 3 hyperparameters: lambda, lambda_1 and lambda_2.
There are roughly two different approaches in the literature for knowledge graph completion (KGC), namely distance based (DB) models and tensor factorization based (TFB) models. Although both approaches have their own advantages and disadvantages, TFB models cannot attain state-of-the-art performance due to the overfitting problem, and therefore various regularizers are employed for TFB models. In the paper, the authors propose a regularizer for TFB models, namely Duality-induced Regularization (DURA), which is inspired by the score functions of DB models. They come up with a dual problem that involves a distance based KGC model, and show that when the aforementioned regularizer is employed for the primal problem (i.e., the TFB model), the two problems become equivalent.
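The primal-dual connection described above can be sketched numerically: expanding the squared distance of a DB model (with p = 2) recovers the bilinear score of the associated TFB model plus the DURA penalty terms. The CP-style elementwise parameterization below is an illustrative assumption, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16  # embedding dimension

# Hypothetical embeddings for one triplet (head, relation, tail).
h, r, t = rng.normal(size=(3, n))

tfb_score = np.dot(h * r, t)               # primal: CP-style bilinear score <h ∘ r, t>
db_distance_sq = np.sum((h * r - t) ** 2)  # dual: squared DB score, p = 2

# Expansion: ||h ∘ r - t||² = ||h ∘ r||² + ||t||² - 2 <h ∘ r, t>,
# so minimizing the dual distance maximizes the primal score
# while penalizing the DURA terms ||h ∘ r||² + ||t||².
dura_terms = np.sum((h * r) ** 2) + np.sum(t ** 2)
assert np.isclose(db_distance_sq, dura_terms - 2 * tfb_score)
```

This identity is why attaching the DURA terms to the TFB (primal) objective makes it equivalent to the DB (dual) objective.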
Duality-Induced Regularizer for Tensor Factorization Based Knowledge Graph Completion
Tensor factorization based models have shown great power in knowledge graph completion (KGC). However, their performance usually suffers seriously from the overfitting problem. This has motivated various regularizers---such as the squared Frobenius norm and tensor nuclear norm regularizers---but their limited applicability significantly restricts their practical usage. To address this challenge, we propose a novel regularizer---namely, \textbf{DU}ality-induced \textbf{R}egul\textbf{A}rizer (DURA)---which is not only effective in improving the performance of existing models but also widely applicable to various methods. The major novelty of DURA is based on the observation that, for an existing tensor factorization based KGC model (\textit{primal}), there is often another distance based KGC model (\textit{dual}) closely associated with it.